
    Evolving Social Networks via Friend Recommendations

    A social network grows over time through the formation of new connections and relations. In recent years we have witnessed massive growth of online social networks such as Facebook and Twitter, so predicting the evolution of a social network has become a question of great importance. A good model for the evolution of a social network can help in understanding the properties responsible for the changes occurring in a network structure. In this paper we propose such a model. We model the social network as an undirected graph where nodes represent people and edges represent the friendship between them. We define the evolution process as a set of rules that closely resembles how a social network grows in real life. We simulate the evolution process and show how, starting from an initial network, a network evolves under this model. We also discuss how our model can be used to model various complex social networks other than online social networks, such as political networks and various organizations.
    Comment: 5 pages, 8 figures, 2 algorithms
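
    The abstract does not spell out the paper's rule set, so the following is only a minimal sketch of a friend-recommendation style evolution step, assuming a triadic-closure rule (a random node is recommended a friend of a friend and accepts with a fixed probability). The function name, the rule, and the parameter p_accept are illustrative assumptions, not the paper's model.

```python
import random

import networkx as nx


def evolve(G: nx.Graph, steps: int, p_accept: float = 0.3, seed: int = 0) -> nx.Graph:
    """Grow G by a hypothetical friend-of-a-friend recommendation rule.

    NOT the paper's rule set: at each step a random node u is recommended
    a random friend-of-a-friend w, and u accepts (adds the edge u-w)
    with probability p_accept.
    """
    rng = random.Random(seed)
    nodes = list(G.nodes)
    for _ in range(steps):
        u = rng.choice(nodes)
        friends = list(G.neighbors(u))
        if not friends:
            continue
        v = rng.choice(friends)
        candidates = [w for w in G.neighbors(v) if w != u and not G.has_edge(u, w)]
        if candidates and rng.random() < p_accept:
            G.add_edge(u, rng.choice(candidates))
    return G


# Start from a small seed network and let it evolve.
G = nx.path_graph(10)
evolve(G, steps=500)
print(G.number_of_edges())
```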

    Sixsoid: A new paradigm for $k$-coverage in 3D Wireless Sensor Networks

    Coverage in 3D wireless sensor networks (WSNs) is a critical issue to deal with: better coverage models imply more energy-efficient networks. $k$-coverage is one model that ensures that every point in a given 3D Field of Interest (FoI) is guaranteed to be covered by $k$ sensors. In 3D, coming up with a deployment of sensors that guarantees $k$-coverage is much more complicated than in 2D. The basic idea is to find a geometrical shape that is guaranteed to be $k$-covered by a specific arrangement of sensors, and then fill the FoI with non-overlapping copies of this shape. In this work, we propose a new shape for the 3D scenario which we call a \textbf{Devilsoid}. Prior to this work, the shape proposed for coverage in 3D was the so-called \textbf{Reuleaux Tetrahedron}. Our construction is motivated by a construction for the 2D version of the problem \cite{MS}, where it implies better guarantees than the \textbf{Reuleaux Triangle}. Our contribution in this paper is twofold: first, we show how the Devilsoid guarantees more coverage volume than the Reuleaux Tetrahedron; second, we show how the Devilsoid yields a simpler and more pragmatic deployment strategy for 3D wireless sensor networks. We show the construction of the Devilsoid, calculate its volume, and discuss its effect on $k$-coverage in WSNs.
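
    The shape constructions themselves are geometric, but the $k$-coverage condition is easy to state operationally: every point of the FoI must lie within sensing range of at least $k$ sensors. Below is a small Monte Carlo sketch that estimates the $k$-covered fraction of a cubic FoI under a given deployment; the random sensor positions, the cubic FoI, and all parameter names are illustrative assumptions, not the deployment from the paper.

```python
import random


def is_k_covered(point, sensors, radius, k):
    """True if `point` lies within `radius` of at least k sensors."""
    hits = 0
    for s in sensors:
        if sum((p - q) ** 2 for p, q in zip(point, s)) <= radius ** 2:
            hits += 1
            if hits >= k:
                return True
    return False


def k_coverage_fraction(sensors, radius, k, box=1.0, samples=20_000, seed=0):
    """Monte Carlo estimate of the k-covered fraction of a cubic FoI."""
    rng = random.Random(seed)
    covered = 0
    for _ in range(samples):
        p = (rng.uniform(0, box), rng.uniform(0, box), rng.uniform(0, box))
        covered += is_k_covered(p, sensors, radius, k)
    return covered / samples


# Example: 200 uniformly random sensors with sensing radius 0.2, checking 2-coverage.
rng = random.Random(1)
sensors = [(rng.random(), rng.random(), rng.random()) for _ in range(200)]
print(k_coverage_fraction(sensors, radius=0.2, k=2))
```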

    Maximum Matchings via Glauber Dynamics

    In this paper we study the classic problem of computing a maximum cardinality matching in general graphs $G = (V, E)$. The best known algorithm for this problem to date runs in $O(m \sqrt{n})$ time, due to Micali and Vazirani \cite{MV80}. Even for general bipartite graphs this is the best known running time (the algorithm of Hopcroft and Karp \cite{HK73} also achieves this bound). For regular bipartite graphs one can achieve an $O(m)$ time algorithm which, following a series of papers, has recently been improved to $O(n \log n)$ by Goel, Kapralov and Khanna (STOC 2010) \cite{GKK10}. In this paper we present a randomized algorithm based on the Markov Chain Monte Carlo paradigm which runs in $O(m \log^2 n)$ time, thereby obtaining a significant improvement over \cite{MV80}. We use a Markov chain similar to the \emph{hard-core model} for Glauber dynamics with \emph{fugacity} parameter $\lambda$, which is used to sample independent sets in a graph from the Gibbs distribution \cite{V99}, to design a faster algorithm for finding maximum matchings in general graphs. Our result crucially relies on the fact that the mixing time of our Markov chain is independent of $\lambda$, a significant deviation from the recent series of works \cite{GGSVY11, MWW09, RSVVY10, S10, W06} which achieve a computational transition (for estimating the partition function) at a threshold value of $\lambda$. Using the conductance bound, we also prove that mixing takes $\Omega(\frac{m}{k})$ time, where $k$ is the size of the maximum matching.
    Comment: It has been pointed out to us independently by Yuval Peres, Jonah Sherman, Piyush Srivastava and other anonymous reviewers that the coupling used in this paper does not have the right marginals, because of which the mixing time bound does not hold, and neither does the main result of the paper. We thank them for reading the paper with interest and promptly pointing out this mistake.
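
    Neither the paper's chain nor its analysis is reproduced in the abstract (and the closing comment retracts the mixing-time bound), but the general flavor of Glauber dynamics on matchings with fugacity $\lambda$ can be sketched with the standard monomer-dimer chain: pick a uniformly random edge, add it when both endpoints are free with probability $\lambda/(1+\lambda)$, and delete it when it is in the current matching with probability $1/(1+\lambda)$. The sketch below implements that textbook chain, whose stationary distribution weights a matching $M$ proportionally to $\lambda^{|M|}$; it is an illustration only, not the paper's algorithm.

```python
import random


def glauber_matching(edges, steps, lam=1.0, seed=0):
    """Monomer-dimer Glauber dynamics over matchings of a graph.

    The state is a matching M; the stationary distribution gives M
    probability proportional to lam ** len(M), so large lam favors
    large matchings. Illustration only, not the paper's algorithm.
    """
    rng = random.Random(seed)
    matching = set()  # current matching M
    matched = set()   # vertices covered by M
    for _ in range(steps):
        e = rng.choice(edges)
        u, v = e
        if e in matching:
            # Propose deleting e; accept with probability 1/(1+lam).
            if rng.random() < 1.0 / (1.0 + lam):
                matching.remove(e)
                matched.discard(u)
                matched.discard(v)
        elif u not in matched and v not in matched:
            # Propose adding e; accept with probability lam/(1+lam).
            if rng.random() < lam / (1.0 + lam):
                matching.add(e)
                matched.update((u, v))
    return matching


# Sample a matching of the 4-cycle; large lam pushes toward a perfect matching.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(glauber_matching(edges, steps=10_000, lam=50.0))
```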

    The Entropy Influence Conjecture Revisited

    In this paper, we prove that most Boolean functions $f : \{-1,1\}^n \rightarrow \{-1,1\}$ satisfy the Fourier Entropy Influence (FEI) conjecture due to Friedgut and Kalai (Proc. AMS '96). The conjecture says that the entropy of a Boolean function is at most a constant times the influence of the function. The conjecture has been proven for families of functions of smaller size. O'Donnell, Wright and Zhou (ICALP '11) verified the conjecture for the family of symmetric functions, whose size is $2^{n+1}$. They are in fact able to prove the conjecture for the family of $d$-part symmetric functions for constant $d$, whose size is $2^{O(n^d)}$. It is also known that the conjecture is true for a large fraction of polynomial-sized DNFs (COLT '10). Using elementary methods we prove that a random function satisfies the conjecture with high probability, with the constant taken as $(2 + \delta)$ for any constant $\delta > 0$.
    Comment: We thank Kunal Dutta and Justin Salez for pointing out that our result can be extended to a high probability statement.
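
    Using the standard definitions, for $f = \sum_S \hat{f}(S)\chi_S$ the conjecture compares the spectral entropy $H(f) = \sum_S \hat{f}(S)^2 \log_2(1/\hat{f}(S)^2)$ against the total influence $I(f) = \sum_S |S|\, \hat{f}(S)^2$, asserting $H(f) \le C \cdot I(f)$ for a universal constant $C$. The brute-force sketch below computes both quantities; it is a direct transcription of these definitions and is feasible only for toy sizes, since it enumerates all $2^n$ inputs and all $2^n$ sets.

```python
import itertools
import math


def fourier_entropy_influence(f, n):
    """Brute-force spectral entropy and total influence of f: {-1,1}^n -> {-1,1}."""
    points = list(itertools.product((-1, 1), repeat=n))
    entropy = influence = 0.0
    for S in itertools.product((0, 1), repeat=n):  # S as an indicator vector
        # fhat(S) = E_x[f(x) * prod_{i in S} x_i]
        fhat = sum(f(x) * math.prod(x[i] for i in range(n) if S[i])
                   for x in points) / 2 ** n
        w = fhat * fhat  # Fourier weight of S; the weights sum to 1
        if w > 0:
            entropy += w * math.log2(1.0 / w)
        influence += sum(S) * w
    return entropy, influence


# Majority on 3 bits: entropy 2.0, influence 1.5, so the FEI ratio is 4/3.
maj3 = lambda x: 1 if sum(x) > 0 else -1
print(fourier_entropy_influence(maj3, 3))
```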

    EvoCut: A new Generalization of Albert-Barab\'asi Model for Evolution of Complex Networks

    With the evolution of social networks, the network structure shows a dynamic nature in which nodes and edges appear as well as disappear for various reasons. The role of a node in the network is represented by the number of interactions it has with the other nodes. For this purpose a network is modeled as a graph where nodes represent network members and edges represent a relationship among them. Several models for the evolution of social networks have been proposed to date, the most widely accepted being the Barab\'asi-Albert model \cite{Network science}, which is based on \emph{preferential attachment} of nodes according to the degree distribution. This model generates graphs that are called \emph{scale free}, and the degree distribution of such graphs follows a \emph{power law}. Several generalizations of this model have also been proposed. In this paper we present a new generalization of the model and attempt to bring out its implications in real life.
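
    The abstract does not specify the EvoCut rule itself, so as a reference point here is a minimal sketch of the baseline it generalizes, Barab\'asi-Albert preferential attachment: each arriving node attaches $m$ edges to existing nodes chosen with probability proportional to their current degree. The function and parameter names are illustrative.

```python
import random


def barabasi_albert(n, m, seed=0):
    """Baseline Barabasi-Albert preferential attachment (not EvoCut itself).

    Each new node attaches m edges to distinct existing nodes chosen
    with probability proportional to their current degree.
    """
    rng = random.Random(seed)
    # Seed graph: a complete graph on m + 1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # Each node appears in `ends` once per incident edge, so a uniform
    # draw from `ends` is exactly degree-proportional sampling.
    ends = [v for e in edges for v in e]
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(ends))
        for t in targets:
            edges.append((new, t))
            ends.extend((new, t))
    return edges


edges = barabasi_albert(n=1000, m=2)
print(len(edges))  # C(m+1, 2) + m * (n - m - 1) edges
```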
